SCOT Approximation, Training and Asymptotic Inference
Authors
Abstract
Approximation of stationary strongly mixing processes by SCOT models, and the Le Cam-Hajek-Ibragimov-Khasminsky locally minimax theory of statistical inference for them, is outlined. SCOT is an m-Markov model with a sparse memory structure. In our previous papers we proved the equivalence of SCOT to a first-order Markov chain (1-MC) whose state space (alphabet) consists of the SCOT contexts. For a fixed alphabet size and growing sample size, Local Asymptotic Normality is proved and applied to establish asymptotically optimal inference. We outline the obstacles that arise for a large SCOT alphabet size and a not necessarily vast sample size. Training SCOT on a large string using clusters of computers, and statistical applications, are described.
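The abstract's two key ingredients can be illustrated in a few lines: a sparse set of contexts (only the memory the process actually needs is stored, rather than a full depth-m tree), and the reduction to a first-order Markov chain whose states are the contexts. Below is a minimal hypothetical sketch, not the authors' implementation: a binary-alphabet context set and helper functions (`context_of`, `estimate_transitions` are illustrative names) for mapping a history to its context and estimating per-context transition frequencies from a training string.

```python
# Hypothetical SCOT-style sketch (not the paper's code). The context set is
# sparse: the full depth-2 tree would have 4 leaves, but histories ending in
# "0" need no further split. Every history matches exactly one context, so
# the process over contexts is a first-order Markov chain (1-MC).
CONTEXTS = ["0", "01", "11"]  # assumed complete suffix set over {0, 1}

def context_of(history: str) -> str:
    """Return the unique stored context that is a suffix of `history`."""
    # Longest match first, in case one context is a suffix of another.
    for c in sorted(CONTEXTS, key=len, reverse=True):
        if history.endswith(c):
            return c
    raise ValueError("no context matches; context set must be complete")

def estimate_transitions(data: str) -> dict:
    """Empirical next-symbol frequencies per context ('training' on a string)."""
    counts = {c: {"0": 0, "1": 0} for c in CONTEXTS}
    max_len = max(len(c) for c in CONTEXTS)
    for i in range(max_len, len(data)):
        counts[context_of(data[:i])][data[i]] += 1
    probs = {}
    for c, d in counts.items():
        total = sum(d.values())
        if total > 0:  # skip contexts never visited in the sample
            probs[c] = {s: n / total for s, n in d.items()}
    return probs
```

On a long string, the loop over positions partitions cleanly, which is consistent with the abstract's point that training on a large string can be distributed across a cluster: each machine counts transitions on its chunk and the per-context counts are summed.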
Similar Articles
Inference about the Burr Type III Distribution under Type-II Hybrid Censored Data
This paper presents statistical inference on the parameters of the Burr type III distribution when the data are Type-II hybrid censored. The maximum likelihood estimators of the unknown parameters are developed using the EM algorithm. We provide the observed Fisher information matrix, obtained via the missing information principle, which is useful for constructing asymptotic confidence...
Asymptotic Behaviors of the Lorenz Curve for Left Truncated and Dependent Data
The purpose of this paper is to provide some asymptotic results for the nonparametric estimator of the Lorenz curve and Lorenz process in the case where the data are assumed to be strongly mixing and subject to random left truncation. First, we show that the nonparametric estimator of the Lorenz curve is uniformly strongly consistent for the associated Lorenz curve. Also, a strong Gaussian approximation for ...
SCOT Modeling and Its Statistical Applications of Time Series
We modeled and applied the Stochastic COntext Tree (SCOT) for statistical inference about financial, literary, and seismological stationary strings. We analyzed several models viewed as simplified approaches to financial modeling: we evaluate their stationary distributions, entropy rates, and convergence to Brownian motion. We also estimated the SCOT parameters and tested homogeneity of data strings u...
Factorized Asymptotic Bayesian Inference for Mixture Modeling
This paper proposes a novel Bayesian approximation inference method for mixture modeling. Our key idea is to factorize marginal log-likelihood using a variational distribution over latent variables. An asymptotic approximation, a factorized information criterion (FIC), is obtained by applying the Laplace method to each of the factorized components. In order to evaluate FIC, we propose factorize...
Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling
In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for empirical processes proved by Horváth [10]. Simulations are drawn for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...